Coordinate Descent Without Coordinates: Tangent Subspace Descent on Riemannian Manifolds
Nam Ho-Nguyen (University of Sydney)
Abstract: We consider an extension of the coordinate descent algorithm to manifold domains, and provide convergence analyses for geodesically convex and non-convex smooth objective functions. Our key insight is to draw an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold. Hence, our method is called tangent subspace descent (TSD). The core principle behind ensuring convergence of TSD is the appropriate choice of subspace at each iteration. To this end, we propose two novel conditions: the gap-ensuring and $C$-randomized norm conditions, for deterministic and randomized modes of subspace selection respectively. These ensure convergence for smooth functions, and are satisfied in practical contexts. We propose two subspace selection rules of particular practical interest that satisfy these conditions: a deterministic one for the manifold of square orthogonal matrices, and a randomized one for the more general Stiefel manifold. (This is joint work with David Huckleberry Gutman, Texas Tech University.)
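To make the coordinate-block analogy concrete, the following is a minimal, hedged sketch of randomized tangent subspace descent on the unit sphere $S^2 \subset \mathbb{R}^3$, minimizing the Rayleigh quotient $f(x) = x^\top A x$. The step size, the one-dimensional subspace choice, and the normalization retraction are illustrative assumptions for this toy example, not the speakers' exact scheme or their $C$-randomized norm selection rule.

```python
import numpy as np

# Toy problem: minimize f(x) = x^T A x over the unit sphere in R^3.
# The minimum equals the smallest eigenvalue of A (here, 1.0).
rng = np.random.default_rng(0)
A = np.diag([1.0, 2.0, 3.0])

def riemannian_grad(x):
    g = 2.0 * A @ x                    # Euclidean gradient of f
    return g - (x @ g) * x             # project onto the tangent space at x

def retract(x, v):
    y = x + v                          # simple retraction: step, then renormalize
    return y / np.linalg.norm(y)

x = rng.standard_normal(3)
x /= np.linalg.norm(x)
step = 0.1
for _ in range(500):
    # Randomized subspace selection (the "coordinate block" analogue):
    # draw a random 1-D subspace of the tangent space at x.
    u = rng.standard_normal(3)
    u -= (x @ u) * x                   # make u tangent at x
    u /= np.linalg.norm(u)
    g = riemannian_grad(x)
    # Descend only along the chosen subspace, then retract back to the sphere.
    x = retract(x, -step * (g @ u) * u)

final_value = float(x @ A @ x)         # should approach the smallest eigenvalue
print(final_value)
```

Replacing the random 1-D subspace with the full tangent space recovers ordinary Riemannian gradient descent; the talk's analysis concerns which subspace selection rules (deterministic or randomized) still guarantee convergence.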
optimization and control
Audience: researchers in the topic
Variational Analysis and Optimisation Webinar
Series comments: Register at www.mocao.org/va-webinar/ to receive the Zoom connection details.
| Organizers: | Hoa Bui*, Matthew Tam*, Minh Dao, Alex Kruger, Vera Roshchina*, Guoyin Li |
| *contact for this listing |
